Zero-NAV: Democratising aerial navigation via robust and data-scalable computer vision

The value of research shines brightest when it touches everyday life. Today's scientific journals in Computer Vision (CV) report exciting developments that promise a future in which autonomous systems democratise and automate the navigation and control of ground and aerial platforms. Academic advances, however, increasingly focus on theoretical solutions optimised to perform over a given data domain: a set of real camera images collected at a specific location, with specific camera parameters, weather, season, date, etc. This poses a challenge to the democratisation of such techniques in real-world applications such as aerial navigation. As a rising countermeasure, synthetic data and rendering engines have attempted to satisfy the data hunger of CV models. Two paths exist for bridging the gap between scarce real data and abundant synthetic data: either making the synthetic data highly representative of reality, or making models robust to sim-to-real transfer.

Fig. 1: Through the Looking-Glass, Alice pushes through the mirror (between real and synthetic data)

We believe that the real challenge for today's CV is to develop and validate algorithms that work robustly and in an economically and data-scalable way.

Do you believe, like us, in pushing CV advances into the real world? Will you be a champion in proving the benefits of CV advances to the world of aviation? Then join us and earn:

→ Respect and fame: you, your team, product, or company recognised as the leader in the real-world challenge of aerial navigation. The leaderboard is open to all, with no restrictions.

→ The champions' prize: 10,000 CHF (8,000 CHF for first place and 2,000 CHF for second), awarded to champions who choose to share their solution with the world (open source) for the benefit of all.

The challenge, should you choose to accept it…

Today, aerial autonomous systems for navigation and control depend heavily on robust GNSS reception. The lack thereof (due to terrain, weather, or adversarial spoofing) can lead to loss of the autonomous system's absolute orientation in the medium to long run, making operation unsafe for beyond-line-of-sight missions. Despite significant progress in Computer Vision, most learning-based approaches target a single domain and require a dense database of geo-tagged images to function well, or at least a calibration set of real images taken close to the domain of operation [1,2]. Several industry attempts have been made in the same direction, though understandably without an official benchmark or validation.

This competition is about finding the best methods to localise a query image by predicting its scene coordinates and computing an accurate 6D camera pose, with access only to: a) synthetic rendered data/cues (scene coordinates, depth, semantics, …) and the related ground-truth poses for the navigation area of interest; b) a subset of synthetic and real image pairs taken from a similar-looking but geographically different location, to aid sim-to-real transfer.
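Once a network has predicted per-pixel scene coordinates, the 6D camera pose can be recovered from the resulting 2D-3D correspondences with a Perspective-n-Point (PnP) solver, typically wrapped in a RANSAC loop to reject outliers. Below is a minimal, illustrative DLT-based PnP sketch in NumPy, assuming known intrinsics `K` and outlier-free correspondences; the official benchmark may use a different (and more robust) solver.

```python
import numpy as np

def pnp_dlt(pts3d, pts2d, K):
    """Estimate a camera pose [R|t] from 2D-3D correspondences via DLT.

    pts3d: (N, 3) scene coordinates, pts2d: (N, 2) pixel coordinates,
    K: (3, 3) camera intrinsics. Returns R (3, 3) and t (3,) such that
    x ~ K (R X + t). Needs N >= 6 and no outliers (no RANSAC here).
    """
    # Work in normalized camera coordinates to remove the intrinsics.
    ones = np.ones((len(pts2d), 1))
    xn = (np.linalg.inv(K) @ np.hstack([pts2d, ones]).T).T  # (N, 3)
    # Each correspondence contributes two linear equations in the
    # 12 entries of the 3x4 pose matrix M.
    A = []
    for (X, Y, Z), (u, v, _) in zip(pts3d, xn):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    M = Vt[-1].reshape(3, 4)               # pose, up to scale and sign
    # Fix scale and sign using the determinant of the rotation block.
    s = np.linalg.det(M[:, :3])
    M *= np.sign(s) / abs(s) ** (1.0 / 3.0)
    # Project the 3x3 block onto the nearest proper rotation matrix.
    U, _, Vt2 = np.linalg.svd(M[:, :3])
    R = U @ Vt2
    if np.linalg.det(R) < 0:
        R = -R
    t = M[:, 3]
    return R, t
```

In practice one would feed the predicted scene coordinates and their pixel locations into such a solver per query image, keeping only RANSAC inliers before a final refinement.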

+ Uncertainty: competitors are encouraged to provide a meaningful uncertainty estimate for each predicted pose (see Evaluation Criteria).

This competition is based on the CrossLoc project from the Geodetic Engineering Laboratory of EPFL.

This is the official benchmark for this competition. You can learn more about the approach here, but you are not limited to it and may try any other. Can you beat it?

Organisation

Datasets


  1. Training-navigation (in place, LHS)

  2. Matching pairs (out of place)

Data set name                       Type         Number of images   Link
Training-navigation (in place-LHS)  Zero-shot    15000
Matching pairs                      Sim-to-Real  1197

Schedule

Organization committee

Scientific supervisor : Iordan Doytchinov Profile

Technical manager : Régis Longchamp Profile

Prizes

See the terms and conditions for further details

References

[1] Previous related project website (coordinate scene regression based architecture)

[2] Competing State of the art architecture (feature regression based localisation)

Submission Format

To submit your results to the leaderboard, you must construct a submission zip file containing the following CSV files:


CSV files must have the following columns (without column names):

Coordinates are in the WGS84 coordinate system.
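As a sketch of how such a submission could be assembled with the Python standard library — note that the row layout below (image name, WGS84 latitude/longitude, altitude, orientation quaternion, uncertainty) is purely hypothetical, since the final column specification is not fixed here:

```python
import csv
import zipfile

# Hypothetical row layout for illustration only: image name, WGS84
# latitude, longitude, altitude [m], orientation quaternion
# (qw, qx, qy, qz), and an optional per-image uncertainty value.
rows = [
    ("query_0001.png", 46.5191, 6.5668, 512.3, 1.0, 0.0, 0.0, 0.0, 2.5),
]

# Per the rules, the CSV is written without a header row.
with open("poses.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)

# Bundle all CSV files into the submission archive.
with zipfile.ZipFile("submission.zip", "w") as z:
    z.write("poses.csv")
```

Adapt the file names and columns to the official specification before submitting.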

Evaluation Criteria (to be updated!)

  1. Accuracy of positioning (70%)

  2. Providing a meaningful uncertainty statement (30%)

If an uncertainty statement is provided, competitors will be judged on the median error over the 5% of images with the lowest predicted uncertainty.
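The uncertainty-based part of the criterion can be sketched as follows; the exact error metric (assumed here to be a scalar positioning error per image) and tie-breaking are assumptions:

```python
import numpy as np

def lowest_uncertainty_median(errors, uncertainties, fraction=0.05):
    """Median positioning error over the fraction of images the
    competitor is most confident about (lowest predicted uncertainty).

    errors: per-image positioning errors; uncertainties: the
    competitor's per-image uncertainty statements (same ordering).
    """
    errors = np.asarray(errors, dtype=float)
    uncertainties = np.asarray(uncertainties, dtype=float)
    k = max(1, int(round(fraction * len(errors))))
    idx = np.argsort(uncertainties)[:k]   # most confident predictions
    return float(np.median(errors[idx]))
```

A well-calibrated uncertainty estimate therefore pays off directly: if the images you flag as most confident really are your most accurate ones, this median stays low.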

Terms and Conditions

Data

  1. shall only be downloaded if you agree to these terms
  2. is to be used for academic purposes only
  3. will not be used for commercial purposes
  4. will not be transferred to any third party

Prizes attribution

The two highest-scoring entries that beat the CrossLoc algorithms will be considered the winners.

Winners agree to:

  1. Transmit their code for evaluation purposes.
  2. Document their code so that it can be run easily.
